Markov process
noun
Definition of MARKOV PROCESS
: a stochastic process (as Brownian motion) that resembles a Markov chain except that the states are continuous; also : Markov chain —called also Markoff process
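The defining feature named above (a process whose evolution depends only on its current state) can be sketched with a tiny simulation. The two-state "weather" chain and its transition probabilities below are hypothetical, chosen purely to illustrate the Markov property; they do not come from the entry itself.

```python
import random

# Hypothetical two-state chain; transition probabilities are illustrative only.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state, rng):
    """Pick the next state using only the current state (the Markov property)."""
    r = rng.random()
    cumulative = 0.0
    for nxt, p in TRANSITIONS[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding

def simulate(start, n, seed=0):
    """Generate a path of n steps starting from `start`."""
    rng = random.Random(seed)
    states = [start]
    for _ in range(n):
        states.append(step(states[-1], rng))
    return states
```

Note that `step` never inspects the history of the walk, only the current state: that restriction is exactly what makes the process a Markov chain rather than a general stochastic process.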
First Known Use of MARKOV PROCESS
1938